📚 node [[long_short term_memory_(lstm)|long short term memory (lstm)]]

long short-term memory - LSTM

Go to [[Week 2 - Introduction]] or back to the [[Main AI Page]]. Part of the pages on [[Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Natural Language Processing]] and [[Attention Mechanism]].

An LSTM is a recurrent neural network (RNN) architecture, a deep learning architecture in the same way that [[CNNS - Convolutional neural networks]] are.

These networks are able to selectively forget information, according to the RNN section of the beginners' guide to NLP.

According to Wikipedia:

A common LSTM unit is composed of a cell, an input gate, an output gate and a forget gate. The cell remembers values over arbitrary time intervals and the three gates regulate the flow of information into and out of the cell.

An LSTM unit is made up of:

  • a cell
  • an input gate
  • an output gate
  • a forget gate
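The way these four parts interact can be sketched as a single time step. This is a minimal scalar sketch, not a real implementation: the weight layout (one input weight, one recurrent weight, and one bias per gate, packed into a hypothetical `w` dict) stands in for the vector-valued matrices an actual LSTM uses.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    # w maps each gate name to a hypothetical (input weight, recurrent weight, bias) triple.
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])  # forget gate: keep how much old cell state?
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])  # input gate: admit how much new information?
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])  # output gate: expose how much of the cell?
    c_tilde = math.tanh(w["c"][0] * x + w["c"][1] * h_prev + w["c"][2])  # candidate cell value
    c = f * c_prev + i * c_tilde  # the cell remembers across time steps, gated in and out
    h = o * math.tanh(c)          # hidden state is the gated, squashed cell state
    return h, c
```

With the forget gate saturated at 1 and the input gate at 0, the cell state passes through unchanged, which is how an LSTM "remembers values over arbitrary time intervals".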

A video on how LSTMs work

The difference between LSTMs and [[GRUs]]
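The core of that difference can be sketched in code: a GRU has no separate cell state, and it merges the forget and input gates into one update gate that interpolates between the old hidden state and a candidate. As above, this is a minimal scalar sketch with a hypothetical `w` dict of (input weight, recurrent weight, bias) triples, not a real implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h_prev, w):
    z = sigmoid(w["z"][0] * x + w["z"][1] * h_prev + w["z"][2])  # update gate (plays both forget and input roles)
    r = sigmoid(w["r"][0] * x + w["r"][1] * h_prev + w["r"][2])  # reset gate: how much history feeds the candidate
    h_tilde = math.tanh(w["h"][0] * x + w["h"][1] * (r * h_prev) + w["h"][2])  # candidate hidden state
    return (1.0 - z) * h_prev + z * h_tilde  # single gate interpolates old state and candidate
```

So where the LSTM keeps two states (hidden and cell) and three gates, the GRU keeps one state and two gates.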

An LSTM and a GRU side-by-side
